
    Child's play: activity recognition for monitoring children's developmental progress with augmented toys

    The way in which infants play with objects can be indicative of their developmental progress and may serve as an early indicator of developmental delays. However, observing children interacting with toys for the purpose of quantitative analysis can be a difficult task. To better quantify how play may serve as an early indicator, researchers have conducted retrospective studies examining the differences in object play behaviors among infants. Such studies, however, require that researchers repeatedly inspect videos of play, often at speeds much slower than real time, to identify points of interest. The research presented in this dissertation examines whether a combination of sensors embedded within toys and automatic pattern recognition of object play behaviors can help expedite this process. For my dissertation, I developed the Child'sPlay system, which uses augmented toys and statistical models to automatically provide quantitative measures of object play interactions, as well as the PlayView interface for viewing annotated play data during later analysis. I examine the hypothesis that sensors embedded in objects can provide sufficient data for automatic recognition of certain exploratory, relational, and functional object play behaviors in semi-naturalistic environments, and that a continuum of recognition accuracy exists which allows automatic indexing to be useful for retrospective review. I designed several augmented toys and used them to collect object play data from more than fifty play sessions. I conducted pattern recognition experiments on these data to produce statistical models that automatically classify children's object play behaviors. In addition, I conducted a user study with twenty participants to determine whether annotations automatically generated by these models improve performance in retrospective review tasks. My results indicate that these statistical models increase user performance and decrease perceived effort when combined with the PlayView interface during retrospective review. High-quality annotations are preferred by users and increase the effective retrieval rate of object play behaviors.
    Ph.D. Committee Chair: Starner, Thad E.; Committee Co-Chair: Abowd, Gregory D.; Committee Member: Arriaga, Rosa; Committee Member: Jackson, Melody Moore; Committee Member: Lukowicz, Paul; Committee Member: Rehg, James M.
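
    The pipeline described above (sensor streams from augmented toys, statistical models that classify play behaviors, and automatically generated annotations for review in PlayView) can be illustrated with a minimal sketch. The feature set, window size, behavior labels, and random-forest classifier below are illustrative assumptions, not the dissertation's actual design.

```python
# Minimal sketch of a sensor-to-annotation pipeline for object play
# recognition. Assumes 3-axis accelerometer windows from an augmented toy;
# labels, features, and classifier are illustrative, not the dissertation's.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def window_features(window: np.ndarray) -> np.ndarray:
    """Summarize one (n_samples, 3) accelerometer window as a feature vector."""
    mag = np.linalg.norm(window, axis=1)          # overall motion energy
    return np.concatenate([
        window.mean(axis=0), window.std(axis=0),  # per-axis statistics
        [mag.mean(), mag.std(), mag.max()],       # magnitude statistics
    ])

# Hypothetical training data: windows of toy motion with behavior labels.
rng = np.random.default_rng(0)
LABELS = ["shake", "bang", "stack"]               # illustrative behaviors
X = np.stack([window_features(rng.normal(size=(50, 3))) for _ in range(300)])
y = rng.choice(LABELS, size=300)                  # placeholder annotations

clf = RandomForestClassifier(n_estimators=100).fit(X, y)

# At review time, each incoming window becomes an annotation for the viewer.
new_window = rng.normal(size=(50, 3))
print(clf.predict([window_features(new_window)])[0])
```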

    GART: The Gesture and Activity Recognition Toolkit

    Presented at the 12th International Conference on Human-Computer Interaction, Beijing, China, July 2007. The original publication is available at www.springerlink.com.
    The Gesture and Activity Recognition Toolkit (GART) is a user interface toolkit designed to enable the development of gesture-based applications. GART provides an abstraction over machine learning algorithms suitable for modeling and recognizing different types of gestures. The toolkit also provides support for data collection and the training process. In this paper, we present GART and its machine learning abstractions. Furthermore, we detail the components of the toolkit and present two example gesture recognition applications.
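
    As a rough illustration of the kind of abstraction the paper describes (collect labeled examples, train per-gesture models, recognize new traces), the sketch below invents a small toolkit-style API. The class and method names are hypothetical and are not GART's actual interface; a real toolkit would fit proper sequence models (e.g., hidden Markov models) where this sketch stores simple length templates.

```python
# Hypothetical toolkit-style API illustrating the train/recognize
# abstraction a gesture toolkit provides. Names are invented, not GART's.
from dataclasses import dataclass, field

@dataclass
class GestureLibrary:
    """Collects labeled sensor traces and recognizes new ones."""
    examples: dict = field(default_factory=dict)   # gesture name -> traces

    def add_example(self, name: str, trace: list) -> None:
        self.examples.setdefault(name, []).append(trace)

    def train(self) -> None:
        # A real toolkit would fit a model per gesture here; this sketch
        # just stores each gesture's mean trace length as a template.
        self.templates = {n: sum(map(len, ts)) / len(ts)
                          for n, ts in self.examples.items()}

    def recognize(self, trace: list) -> str:
        # Nearest template by trace length, a stand-in for model scoring.
        return min(self.templates,
                   key=lambda n: abs(self.templates[n] - len(trace)))

lib = GestureLibrary()
lib.add_example("circle", [0.1] * 40)
lib.add_example("swipe", [0.1] * 12)
lib.train()
print(lib.recognize([0.2] * 38))   # -> "circle"
```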

    ActionGSR: A combination galvanic skin response-accelerometer for physiological measurements in active environments

    The galvanic skin response (GSR), also known as electrodermal response, measures changes in electrical resistance across two regions of the skin. Galvanic skin response can measure arousal levels in children with autism; however, the GSR signal may be overwhelmed by the vigorous movements of the children. This paper introduces ActionGSR, a wireless sensor capable of measuring both GSR and acceleration simultaneously in an attempt to disambiguate valid GSR signals from motion artifacts.
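
    One way to use the paired signals such a sensor provides is to treat GSR samples recorded during vigorous movement as unreliable. The sketch below assumes time-aligned GSR and accelerometer streams; the threshold and the masking strategy are illustrative assumptions, not the paper's method.

```python
# Sketch of the kind of artifact rejection a combined GSR-accelerometer
# sensor enables: flag GSR samples as unreliable whenever simultaneous
# acceleration indicates vigorous movement. Threshold is illustrative.
import numpy as np

def mask_motion_artifacts(gsr: np.ndarray, accel: np.ndarray,
                          thresh: float = 1.5) -> np.ndarray:
    """Return GSR with samples during strong motion replaced by NaN.

    gsr:   (n,) skin-conductance samples
    accel: (n, 3) time-aligned acceleration samples
    """
    motion = np.linalg.norm(accel, axis=1)        # movement intensity
    cleaned = gsr.astype(float)                   # copy as float for NaN
    cleaned[motion > thresh] = np.nan             # drop artifact-prone spans
    return cleaned

rng = np.random.default_rng(1)
gsr = rng.normal(5.0, 0.2, size=200)              # synthetic GSR trace
accel = rng.normal(0.0, 1.0, size=(200, 3))       # synthetic motion
print(np.isnan(mask_motion_artifacts(gsr, accel)).sum(), "samples masked")
```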

    Biometric Identification using Song-Based Blink Patterns

    In this paper we describe a system that uses patterns of eye blinks as a biometric. Each user chooses a personal blink pattern based on a song (for example “Jingle Bells”). To establish their identity, the user looks at the system’s camera and blinks to the cadence of their chosen song. Computer vision is used to detect the blinks and to compare the blinked cadence to a database of stored blinked patterns to determine which song is being blinked, and therefore which user is performing the blinking. To make the system more secure, the system compares the characteristics of the blinking itself, the “blinkprint”, to the user’s stored blinkprint. This provides a verification check to help protect against the theft and subsequent use of a user’s blink pattern. We discuss the possible use of an enrollment process that alerts new users when their new blink code is similar to other codes already in the database, and report on the results of a preliminary experiment using this new enrollment procedure.
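
    A cadence comparison of the kind described above can be sketched by representing each blink pattern as a sequence of inter-blink intervals and matching it against enrolled patterns. Dynamic time warping is one plausible distance measure; the abstract does not specify the paper's actual matching method, and the data below are invented.

```python
# Sketch of blink-cadence matching: compare observed inter-blink intervals
# against enrolled patterns and return the closest user. DTW is an assumed
# choice of distance, not necessarily the paper's method.
import numpy as np

def dtw(a: np.ndarray, b: np.ndarray) -> float:
    """Classic dynamic-time-warping distance between two 1-D sequences."""
    D = np.full((len(a) + 1, len(b) + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[len(a), len(b)]

def identify(observed: np.ndarray, database: dict) -> str:
    """Return the user whose stored blink cadence best matches `observed`."""
    return min(database, key=lambda user: dtw(observed, database[user]))

# Inter-blink intervals in seconds (hypothetical enrolled patterns).
db = {"alice": np.array([0.4, 0.4, 0.8, 0.4]),
      "bob":   np.array([0.3, 0.6, 0.3, 0.6, 0.3])}
print(identify(np.array([0.42, 0.39, 0.85, 0.41]), db))  # -> "alice"
```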

    Recognizing Soldier Activities in the Field

    We describe the activity recognition component